The new ALICE DQM client: a web access to ROOT-based objects

Authors

  • B von Haller
  • F Carena
  • W Carena
  • S Chapeland
  • V Chibante Barroso
  • F Costa
  • C Delort
  • E Dénes
  • R Divià
  • U Fuchs
  • J Niedziela
  • G Simonetti
  • C Soós
  • A Telesca
  • P Vande Vyvre
Abstract

A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) plays an essential role in the operation of the experiment by providing shifters with immediate feedback on the data being recorded, so that problems can be quickly identified and overcome. Immediate access to the DQM results is needed not only by shifters in the control room but also by detector experts worldwide. As a consequence, a new web application has been developed to dynamically display and manipulate the ROOT-based objects produced by the DQM system in a flexible and user-friendly interface. The architecture and design of the tool, its main features and the technologies that were used, both on the server and the client side, are described. In particular, we detail how we took advantage of the most recent ROOT JavaScript I/O and web server library to give interactive access to ROOT objects stored in a database. We also describe the use of modern web techniques and packages such as AJAX, DHTMLX and jQuery, which has been instrumental in the successful implementation of a reactive and efficient application. We then present the resulting application and how code quality was ensured, and conclude with a roadmap for future technical and functional developments.

1. ALICE

ALICE [1] is a general-purpose detector dedicated to the study of heavy-ion collisions at the CERN LHC. It is optimized to study the properties of the deconfined state of quarks and gluons produced in such collisions, known as the quark-gluon plasma [2]. It is also well suited to study elementary collisions such as proton-proton and proton-nucleus interactions. Robust tracking and particle identification over a wide pT range are ensured by different types of detectors designed to cope with high particle multiplicities.
The commissioning of the experiment was carried out during 2008 and 2009 in the underground experimental pit. From the startup of the LHC in November 2009 until the long shutdown in February 2013, the experiment took data successfully and almost continuously.

21st International Conference on Computing in High Energy and Nuclear Physics (CHEP2015), IOP Publishing, Journal of Physics: Conference Series 664 (2015) 062064, doi:10.1088/1742-6596/664/6/062064. Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI. Published under licence by IOP Publishing Ltd.

[Figure: DQM publish/subscribe architecture — publisher agents publish MonitorObjects into a Monitor Objects Pool; clients access the data; notifications are delivered via DIM.]
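The interactive access described in the abstract relies on ROOT's web server library (THttpServer), which can expose registered objects in a machine-readable JSON form that the JavaScript client then renders. As a rough illustration only — the host, port and object path below are placeholders, not the actual ALICE DQM configuration — a client could retrieve such an object like this:

```python
# Hypothetical sketch of fetching a ROOT object's JSON representation from a
# THttpServer endpoint. The base URL and item path are illustrative placeholders.
import json
import urllib.request


def object_json_url(base, item):
    """Build the URL of an object's JSON representation (the 'root.json'
    request that THttpServer understands for registered objects)."""
    return f"{base.rstrip('/')}/{item.strip('/')}/root.json"


def fetch_object(base, item, timeout=5):
    """Retrieve and decode a ROOT object published by a THttpServer.
    Requires a running server; raises URLError otherwise."""
    with urllib.request.urlopen(object_json_url(base, item), timeout=timeout) as r:
        return json.loads(r.read().decode())


# Example usage (assumes a THttpServer is running locally):
#   histo = fetch_object("http://localhost:8080", "Objects/hpx")
#   print(histo["_typename"])   # the ROOT class name of the object
```

In the browser, the equivalent step is performed via AJAX and the decoded object is handed to the ROOT JavaScript I/O library for drawing.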


Publication date: 2015